Decomposable norm minimization with proximal-gradient homotopy algorithm

Authors

  • Reza Eghbali
  • Maryam Fazel
Abstract

We study the convergence rate of the proximal-gradient homotopy algorithm applied to norm-regularized linear least squares problems, for a general class of norms. The homotopy algorithm reduces the regularization parameter in a series of steps, and uses a proximal-gradient algorithm to solve the problem at each step. The proximal-gradient algorithm has a linear rate of convergence provided that the objective function is strongly convex and the gradient of its smooth component is Lipschitz continuous. In many applications, the objective function in this type of problem is not strongly convex, especially when the problem is high-dimensional and the chosen regularizers induce sparsity or low-dimensionality. We show that if the linear sampling matrix satisfies certain assumptions and the regularizing norm is decomposable, the proximal-gradient homotopy algorithm converges at a linear rate even though the objective function is not strongly convex. Our result generalizes results on the linear convergence of the homotopy algorithm for ℓ1-regularized least squares problems. Numerical experiments are presented that support the theoretical convergence rate analysis.
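For the ℓ1 special case, the scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact update rule: the decrease factor `eta`, the fixed inner iteration count, and the stopping strategy are all assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise shrinkage.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_homotopy(A, b, lam_target, eta=0.7, inner_iters=100):
    """Sketch of proximal-gradient homotopy for
    min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    The regularization parameter is reduced geometrically (factor eta,
    an assumed choice), and each stage is solved approximately by ISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    lam = np.max(np.abs(A.T @ b))          # largest lam for which x = 0 is optimal
    while lam > lam_target:
        lam = max(eta * lam, lam_target)   # homotopy step: shrink the parameter
        for _ in range(inner_iters):       # proximal-gradient (ISTA) inner loop
            grad = A.T @ (A @ x - b)
            x = soft_threshold(x - grad / L, lam / L)
    return x
```

The warm start across stages is what the paper's analysis exploits: each stage begins close to the new stage's solution, so the inner solver needs only a bounded amount of work.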


Similar articles

Efficient k-Support-Norm Regularized Minimization via Fully Corrective Frank-Wolfe Method

The k-support-norm regularized minimization has recently been applied with success to sparse prediction problems. The proximal gradient method is conventionally used to minimize this composite model. However, it tends to suffer from a high per-iteration cost, so solving the model can be time-consuming. In our work, we reformulate the k-support-norm regularized formulation into a constrained fo...

A Review of Fast ℓ1-Minimization Algorithms for Robust Face Recognition

ℓ1-minimization refers to finding the minimum ℓ1-norm solution to an underdetermined linear system b = Ax. It has recently received much attention, mainly motivated by the new compressive sensing theory, which shows that under quite general conditions the minimum ℓ1-norm solution is also the sparsest solution to the system of linear equations. Although the underlying problem is a linear program, conve...

An implementable proximal point algorithmic framework for nuclear norm minimization

The nuclear norm minimization problem is to find a matrix with the minimum nuclear norm subject to linear and second order cone constraints. Such a problem often arises from the convex relaxation of a rank minimization problem with noisy data, and appears in many fields of engineering and science. In this paper, we study inexact proximal point algorithms in the primal, dual and primal-dual forms...


An accelerated proximal gradient algorithm for nuclear norm regularized linear least squares problems

The affine rank minimization problem, which consists of finding a matrix of minimum rank subject to linear equality constraints, has been proposed in many areas of engineering and science. A specific rank minimization problem is the matrix completion problem, in which we wish to recover a (low-rank) data matrix from incomplete samples of its entries. A recent convex relaxation of the rank minim...
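In such nuclear norm regularized problems, the proximal step reduces to soft-thresholding the singular values. A minimal sketch of this operator (the helper name `svt` is ours, not from the cited paper):

```python
import numpy as np

def svt(M, tau):
    # Proximal operator of tau * (nuclear norm): soft-threshold
    # the singular values of M and reassemble the matrix.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt
```

Each (accelerated) proximal gradient iteration applies this operator to a gradient step, which is why the per-iteration cost is dominated by one SVD.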

Journal:
  • Comp. Opt. and Appl.

Volume 66, Issue

Pages  -

Publication date 2017